Mixed precision accumulation for neural network inference guided by componentwise forward error analysis

arXiv.org Artificial Intelligence

El-Mehdi El Arar 1, Silviu-Ioan Filip 1, Théo Mary 2, and Elisa Riccietti 3

1 Inria, IRISA, Université de Rennes, 263 Av. Général Leclerc, F-35000, Rennes, France
2 Sorbonne Université, CNRS, LIP6, 4 Place Jussieu, F-75005, Paris, France
3 ENS de Lyon, CNRS, Inria, Université Claude Bernard Lyon 1, LIP, UMR 5668, 69342, Lyon cedex 07, France

Abstract

This work proposes a mathematically founded mixed precision accumulation strategy for the inference of neural networks. Our strategy is based on a new componentwise forward error analysis that explains the propagation of errors in the forward pass of neural networks. Specifically, our analysis shows that the error in each component of the output of a layer is proportional to the condition number of the inner product between the weights and the input, multiplied by the condition number of the activation function. These condition numbers can vary widely from one component to another, thus creating a significant opportunity to introduce mixed precision: each component should be accumulated in a precision inversely proportional to the product of these condition numbers. We propose a practical algorithm that exploits this observation: it first computes all components in low precision, uses this output to estimate the condition numbers, and recomputes in higher precision only the components associated with large condition numbers. We test our algorithm on various networks and datasets and confirm experimentally that it can significantly improve the cost-accuracy tradeoff compared with uniform precision accumulation baselines.
Keywords: neural network, inference, error analysis, mixed precision, multiply-accumulate

1 Introduction

Modern applications in artificial intelligence require increasingly complex models, and thus growing memory, time, and energy costs for storing and deploying large-scale deep learning models with parameter counts ranging from the millions to the billions. This is a limiting factor both for training and for inference. While growing training costs can be tackled by the power of modern computing resources, notably GPU accelerators, the deployment of large-scale models leads to serious limitations in inference contexts with constrained resources, such as embedded systems or applications that require real-time processing.
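The two-pass strategy described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the threshold `tau` and the use of the low-precision output to estimate each inner product's condition number κ = Σ|wᵢxᵢ| / |Σ wᵢxᵢ| are assumptions for the sketch, and the activation function's condition number (which the paper's analysis also involves) is omitted for simplicity.

```python
import numpy as np

def mixed_precision_layer(W, x, tau=1e2):
    """Illustrative two-pass mixed precision accumulation for one linear layer.

    Pass 1: accumulate every output component in low precision (float16).
    Then estimate the condition number of each inner product and recompute
    in high precision (float64) only the ill-conditioned components.
    `tau` is a hypothetical tuning threshold, not a value from the paper.
    """
    # Low-precision pass: all inner products accumulated in float16.
    y_low = (W.astype(np.float16) @ x.astype(np.float16)).astype(np.float64)

    # Condition number of the inner product w^T x:
    #   kappa = sum_i |w_i x_i| / |sum_i w_i x_i|,
    # which is large exactly when heavy cancellation occurs in the sum.
    # The denominator is taken from the cheap low-precision output.
    num = np.abs(W * x).sum(axis=1)
    denom = np.abs(y_low)
    kappa = num / np.maximum(denom, np.finfo(np.float64).tiny)

    # High-precision pass, restricted to the ill-conditioned components.
    y = y_low.copy()
    ill = kappa > tau
    if ill.any():
        y[ill] = W[ill].astype(np.float64) @ x.astype(np.float64)
    return y, kappa
```

A row like [1, -1] applied to the input [1, 1] cancels exactly, so its condition number blows up and only that component is recomputed in float64, while well-conditioned components keep their cheap float16 result.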


A futuristic networking platform launching in UAE

#artificialintelligence

Quid Pro Quo (QPQ) International, an innovative networking platform, embraces three of the biggest trends of the future: the sharing economy, artificial intelligence, and blockchain principles. Dubai, UAE: QPQ International, an innovative online networking platform built around these three trends, will launch in the UAE this week. The state-of-the-art platform will cater to established business owners, entrepreneurs, SMEs, influencers, and high-net-worth individuals, helping them network, collaborate on investments and projects, and grow their businesses in a safe, secure, and trusted environment. The new business referral network is the result of research and development by serial entrepreneur Oksana Tashakova, who is also the founder of Wealth Dynamics Unlimited, a premier entrepreneurship education company. QPQ was co-founded by Omair Saoud Arar Al Dhaheri, who spent 22 years as Director of Real Estate at the Abu Dhabi Investment Authority (ADIA) and is Chairman of Bin Arar Holding & Midein Energy of the UAE, along with global strategic partner Jason Michaud, who is also a partner at Canada-based Luiza Artificial Intelligence Technologies. Ahead of the official launch of the platform, Oksana Tashakova, co-founder of QPQ, said: "In today's world, networking is a necessity.